TikTok and privacy for minors: the open questions

The charges brought against TikTok by the UK data protection authority (ICO) have brought to light critical issues in its compliance with data protection legislation, in particular a failure to respect children’s privacy. Let’s take stock.

At the end of September, the UK Information Commissioner’s Office (ICO) announced that it had issued a Notice of Intent to TikTok Inc. and TikTok Information Technologies UK Limited, warning that they could face a £27 million fine for the TikTok app’s breaches of UK data protection law.

The Notice of Intent is a measure that, following investigations conducted by the ICO relating to the period between May 2018 and July 2020, sets out provisional findings of non-compliance with the applicable privacy law and requires appropriate remedial measures to be taken, failing which the proposed penalty will be imposed.

TikTok and privacy for minors: the challenges

The objections relate specifically to the lack of respect for children’s privacy; in particular, the critical points highlighted by the ICO concern:

  1. the lack of consent from those exercising parental responsibility for users under the age of thirteen;
  2. the absence of adequate privacy information (i.e. information that is concise, transparent and easily understandable by users);
  3. the processing of special categories of data (such as ethnic and racial origin, sexual orientation, genetic, biometric and health data, political and religious opinions) without due prior express consent.

The fact that the penalty could be the highest ever issued by the ICO makes clear the seriousness of the charges against TikTok.

First of all, the supervisory authority considered the nature of the persons involved: the data subjects whose data were allegedly processed unlawfully. Since they belong to a vulnerable category (minors), the risk to their rights was considered greater, especially given the exposure made possible by one of the most widely used social networks in the world.

That exposure becomes even more dangerous where the data collected fall within the so-called sensitive or special categories. Lastly, the proposed penalty was also calibrated to the duration of the violations (almost two years ascertained), the identity of the offender and its turnover.

By contrast, the clarity of the privacy notice might appear a minor issue. In fact, independent data protection authorities have repeatedly pointed out, in the course of their investigations, that a notice not suited to the particular characteristics of its intended audience fails to safeguard fundamental rights: if it is not expressed in clear language, it defeats its very purpose of enabling data subjects to understand and properly exercise those rights.

TikTok’s reaction

The companies involved must now put forward their arguments and, if they see fit, adopt any measures not already implemented to remedy the alleged breaches.

At the moment, there are no official statements from TikTok on the individual contested facts. Spokespersons for the app have merely stated that, while they respect the ICO’s role in safeguarding privacy in the UK, they disagree with the preliminary views expressed and intend to respond formally to the ICO. The case, however, resembles measures that other regulators have recently taken against TikTok over similar issues.

In fact, TikTok’s controlling legal entity, ByteDance (a Chinese conglomerate partly controlled by the Chinese government), has already had to deal with privacy-related requests, inspections and measures from many other countries: the USA, Australia, India, the Netherlands, France, Italy and now the UK, to name but a few.

This scrutiny of TikTok is likely to be particularly pervasive compared with other social networks and apps that present similar issues in protecting minors’ privacy. This is partly because TikTok collects a huge amount of data of many different types (email address and phone number at registration, links to accounts on other social platforms, location, audio recordings, access to the camera and to contacts), and partly because of fears that this data could be shared with the Chinese government, not only through ordinary use of the app but also through surveillance software, since a company in China can, by law, be forced to provide data to the government on request.

ICO evaluations also in line with GDPR

Even though the data protection legislation currently applied in the UK is not exactly the same as that in force in the EEA (European Economic Area), the ICO’s assessments closely mirror those of the Italian Data Protection Authority, which conducted an audit of the platform in 2021 and reached the same conclusion of insufficient attention to the protection of minors’ personal data.

In that case, too, it emerged that the ban on registration by children under 13 was easily circumvented (by creating an account with a false date of birth) and that the principles of privacy by design and privacy by default were not respected: the app’s default settings made profiles public even for minors, giving even the very young maximum visibility by default.
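To illustrate what privacy by default would mean in practice, here is a minimal sketch in Python. The class and function names are hypothetical, not taken from TikTok’s actual codebase, and the age thresholds merely mirror the rules discussed above.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AccountSettings:
    # Privacy by default: the most restrictive values apply unless the user opts out.
    public_profile: bool = False
    allow_direct_messages: bool = False
    allow_video_download: bool = False
    can_opt_into_public: bool = True

def default_settings_for(birth_date: date, today: date | None = None) -> AccountSettings:
    """Return account defaults under a hypothetical privacy-by-default policy."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if age < 13:
        # An under-13 registration should be refused outright, not merely restricted.
        raise ValueError("registration not permitted under age 13")
    if age < 18:
        # Minors start fully private and cannot switch the profile to public.
        return AccountSettings(can_opt_into_public=False)
    # Adults also start private but may later opt in to public visibility.
    return AccountSettings()
```

The point of such a design is that the most restrictive values are the ones a user gets without taking any action; public visibility has to be an explicit, age-gated opt-in rather than the starting state.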

The Italian Guarantor also highlighted TikTok’s non-compliance with Article 13 of Regulation (EU) 2016/679 (the GDPR) regarding the privacy notice, which cannot be the same for adults and for children and young people, but must be tailored in language and form to the latter. As the Guarantor recalled, a privacy notice must be concise, transparent, intelligible to the data subject and easily accessible, using clear and simple language, and appropriate notices must be provided for minors.

A further sensitive element is the processing of data across different countries and continents, which requires verifying, particularly under the GDPR provisions on transfers of data outside the EU, the organisational measures and safeguards applicable in the various countries where the data collected by TikTok are disclosed and profiled.

The algorithms involved, and the anonymisation methods that social networks like TikTok claim to apply, are in fact not publicly known and are constantly changing, making effective oversight almost impossible.

TikTok’s moves and possible developments

TikTok has declared its intention to adopt a series of measures to block access by users under 13, including the use of artificial intelligence systems for age verification, announcing a project in collaboration with the Irish data protection authority (Ireland being the location of TikTok’s European headquarters).

TikTok has also moved to improve the in-app Privacy Policy for underage users, making it simpler and more understandable. The Privacy Policy currently in force for the EEA, the UK and Switzerland (latest version dated 5/10/2021) explains the settings for private accounts, account suggestions to others, video downloads, direct messages, video duets, stitches, comments and more.

In addition, further aspects need to be verified and improved, especially for minors, to ensure real and substantial respect for the rights enshrined in Art. 16 et seq. of the GDPR: the procedures for exercising the rights to rectification, erasure (the “right to be forgotten”), restriction of processing, data portability and objection (including to automated decision-making) must be simplified for underage users, just as simplification is already required for privacy notices.
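By way of illustration only, the sketch below (in Python, with hypothetical names not drawn from any real TikTok system) shows how a rights-request intake could route requests under Art. 16 et seq. and apply a simplified, plain-language flow for underage users.

```python
from enum import Enum

class Right(Enum):
    RECTIFICATION = "Art. 16"   # correction of inaccurate data
    ERASURE = "Art. 17"         # erasure ("right to be forgotten")
    RESTRICTION = "Art. 18"     # restriction of processing
    PORTABILITY = "Art. 20"     # data portability
    OBJECTION = "Art. 21"       # objection, incl. automated decisions (Art. 22)

def handle_rights_request(user_age: int, right: Right) -> dict:
    """Route a data subject request; minors get a simplified, plain-language flow."""
    simplified = user_age < 18
    return {
        "legal_basis": right.value,
        "flow": "simplified" if simplified else "standard",
        # Plain-language confirmations are the substance of the simplification.
        "response_language": "plain" if simplified else "legal",
    }
```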

An overall assessment of the measures adopted by TikTok to disable some sharing functionalities that are also available to minors would require in-depth study by the supervisory authorities, coordinated at least at European level, after verifying the display algorithms and the service options that are active or can be activated.

Finally, coordinated action by the EDPB (the European Data Protection Board), now possible thanks to shared legislation at least within the EEA, is and will be indispensable to oversee the actual adoption of the required security measures and a real improvement in information and communication addressed to minors.

This article is a free translation of the article written by Senior Partner Ugo Ettore di Stefano from our Italian member firm Lexellent, and published on 19 October 2022 by Cybersecurity360, the cyber security publication of the Digital360 group. To read the full original article in Italian, click here.